AIDE: Fast and Communication Efficient Distributed Optimization
Abstract
In this paper, we present two new communication-efficient methods for distributed minimization of an average of functions. The first algorithm is an inexact variant of the DANE algorithm [20] that allows any local algorithm to return an approximate solution to a local subproblem. We show that such a strategy does not significantly affect the theoretical guarantees of DANE. In fact, our approach can be viewed as a robustification strategy, since the method is substantially better behaved than DANE on data partitions arising in practice. It is well known that the DANE algorithm does not match the communication complexity lower bounds. To bridge this gap, we propose an accelerated variant of the first method, called AIDE, that not only matches the communication lower bounds but can also be implemented using a purely first-order oracle. Our empirical results show that AIDE is superior to other communication-efficient algorithms in settings that naturally arise in machine learning applications.
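For orientation, the two levels of iteration can be sketched as follows, assuming the standard DANE local subproblem from [20] and a Catalyst-style outer loop; the symbols η, μ, τ, and β_t are parameters of this sketch rather than notation quoted from the paper:

    % Inexact DANE: each machine i approximately solves a local
    % subproblem, and the resulting iterates are averaged.
    w_t \approx \operatorname*{argmin}_w \, f_i(w)
        - \langle \nabla f_i(w_{t-1}) - \eta \nabla f(w_{t-1}), w \rangle
        + \frac{\mu}{2} \|w - w_{t-1}\|^2

    % AIDE: wrap the inexact solver in an accelerated proximal-point
    % (Catalyst-style) outer loop with extrapolation.
    w_{t+1} \approx \operatorname*{argmin}_w \, f(w) + \frac{\tau}{2} \|w - y_t\|^2,
    \qquad y_{t+1} = w_{t+1} + \beta_t (w_{t+1} - w_t)

This picture is consistent with the abstract's claim that AIDE needs only a first-order oracle, since both subproblems can themselves be solved approximately by gradient-based local methods.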
Similar Resources
Gradient Sparsification for Communication-Efficient Distributed Optimization
Modern large-scale machine learning applications require stochastic optimization algorithms to be implemented on distributed computational architectures. A key bottleneck is the communication overhead for exchanging information, such as stochastic gradients, among different workers. In this paper, to reduce the communication cost, we propose a convex optimization formulation to minimize the coding...
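As a rough illustration of the underlying idea, here is a generic unbiased sparsification step in Python; the keep-probabilities p are treated as given in this sketch (choosing them well, e.g. to minimize variance under a sparsity budget, is the optimization problem the paper formulates):

    import numpy as np

    def sparsify(g, p, rng=None):
        """Keep coordinate i with probability p[i] and rescale it by
        1/p[i], so the sparse vector is an unbiased estimate of g."""
        rng = np.random.default_rng() if rng is None else rng
        g = np.asarray(g, dtype=float)
        p = np.asarray(p, dtype=float)
        keep = rng.random(g.shape) < p
        out = np.zeros_like(g)
        out[keep] = g[keep] / p[keep]
        return out

Workers would then exchange only the nonzero coordinates of the sparsified gradients, trading a controlled increase in variance for much cheaper communication.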
Distributed Load Balancing Algorithm in Wireless Networks
Alireza Sheikhattar, Master of Science thesis, 2014; directed by Dr. Mehdi Kalantari, Department of Electrical and Computer Engineering. As communication networks scale up in size, complexity, and demand, effective distribution of the traffic load throughout the network is a matter of great importance. Load balancing will e...
Limited-memory Common-directions Method for Distributed Optimization and its Application on Empirical Risk Minimization
Distributed optimization has become an important research topic for dealing with the extremely large volumes of data available in Internet companies today. Additional machines make computation less expensive, but inter-machine communication becomes prominent in the optimization process, and efficient optimization methods should reduce the amount of communication in order to achieve shorte...
Scaling Distributed Machine Learning with System and Algorithm Co-design
For many important machine learning problems, the rapid growth of data and the ever-increasing model complexity, which often manifests itself in a large number of model parameters, mean that no single machine can solve them fast enough. Therefore, distributed optimization and inference are becoming increasingly inevitable for solving large-scale machine learning problems in both academia and...
Memory and Communication Efficient Distributed Stochastic Optimization with Minibatch Prox
We present and analyze statistically optimal, communication- and memory-efficient distributed stochastic optimization algorithms with near-linear speedups (up to log-factors). This improves over prior work, which includes methods with near-linear speedups but polynomial communication requirements (accelerated minibatch SGD) and communication-efficient methods which do not exhibit any runtime spee...
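The core step behind a minibatch-prox scheme can be sketched as follows (our notation, not the paper's: S_t is a sampled minibatch and γ a regularization parameter of this sketch):

    w_{t+1} \approx \operatorname*{argmin}_w \,
        \frac{1}{|S_t|} \sum_{z \in S_t} f(w; z)
        + \frac{1}{2\gamma} \|w - w_t\|^2

Each round thus approximately solves a small regularized problem instead of taking a single gradient step, which is what allows large effective minibatches without sacrificing statistical efficiency.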
Journal: CoRR
Volume: abs/1608.06879
Publication date: 2016